Note on posterior inference for the Bingham distribution
Related articles
Exact Bayesian inference for the Bingham distribution
This paper is concerned with making Bayesian inference from data that are assumed to be drawn from a Bingham distribution. A barrier to the Bayesian approach is the parameter-dependent normalising constant of the Bingham distribution, which, even when it can be evaluated or accurately approximated, would have to be calculated at each iteration of an MCMC scheme, thereby greatly increasing the c...
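To make the barrier concrete, the following is a minimal Python sketch, not taken from the paper: the Bingham density on the unit sphere is proportional to exp(x'Ax), and the likelihood of n observations involves the normalising constant c(A) raised to the power n, so a naive Metropolis-Hastings move on A would need c(A) (approximated below by crude Monte Carlo over the sphere) at every iteration. The concentration matrix, the synthetic data, and the Monte Carlo estimator are illustrative assumptions, not the paper's exact scheme.

import numpy as np

def unnormalised_log_lik(X, A):
    # Sum of x' A x over the rows of X (unit vectors); c(A) is deliberately omitted.
    return np.einsum("ni,ij,nj->", X, A, X)

def mc_log_normalising_constant(A, n_samples=100_000, seed=None):
    # Crude Monte Carlo estimate of log c(A) with respect to the uniform
    # distribution on the sphere; this is the expensive, parameter-dependent
    # quantity a naive MCMC scheme would have to re-estimate at each iteration.
    rng = np.random.default_rng(seed)
    Z = rng.standard_normal((n_samples, A.shape[0]))
    U = Z / np.linalg.norm(Z, axis=1, keepdims=True)  # uniform points on the sphere
    vals = np.einsum("ni,ij,nj->n", U, A, U)
    return np.log(np.mean(np.exp(vals)))

rng = np.random.default_rng(0)
A = np.diag([2.0, 0.0, -2.0])                     # illustrative concentration matrix
X = rng.standard_normal((50, 3))
X /= np.linalg.norm(X, axis=1, keepdims=True)     # synthetic unit-vector "data"
n = X.shape[0]
print("unnormalised log-likelihood:", unnormalised_log_lik(X, A))
print("constant term -n*log c(A)  :", -n * mc_log_normalising_constant(A, seed=1))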
On the Fisher-Bingham distribution
This paper is primarily concerned with sampling from the Fisher–Bingham distribution, and we describe a slice sampling algorithm for doing this. A by-product of this task is an infinite mixture representation of the Fisher–Bingham distribution, with the mixing distributions based on the Dirichlet distribution. Finite numerical approximations are considered and a sampling algorithm based ...
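For readers unfamiliar with the technique, here is a generic one-dimensional slice sampler (stepping out followed by shrinkage), shown only to illustrate the idea; it is not the paper's Fisher–Bingham-specific construction, and the target density and step width w are placeholders.

import numpy as np

def slice_sample_1d(logf, x0, n_iter=5000, w=1.0, seed=None):
    # Generic univariate slice sampler: logf is the target log-density,
    # known only up to an additive constant.
    rng = np.random.default_rng(seed)
    samples, x = [], x0
    for _ in range(n_iter):
        # 1. Draw an auxiliary height under the density at the current point.
        log_u = logf(x) + np.log(rng.uniform())
        # 2. Step out an interval [l, r] that brackets the slice {x : log f(x) > log_u}.
        l = x - w * rng.uniform()
        r = l + w
        while logf(l) > log_u:
            l -= w
        while logf(r) > log_u:
            r += w
        # 3. Propose uniformly on [l, r], shrinking the interval on rejection.
        while True:
            x_new = rng.uniform(l, r)
            if logf(x_new) > log_u:
                x = x_new
                break
            if x_new < x:
                l = x_new
            else:
                r = x_new
        samples.append(x)
    return np.array(samples)

# Example target: a standard normal, specified up to its normalising constant.
draws = slice_sample_1d(lambda x: -0.5 * x * x, x0=0.0, seed=0)
print(draws.mean(), draws.std())   # close to 0 and 1 respectively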
Accurate Inference for the Mean of the Poisson-Exponential Distribution
Although the random sum distribution has been well studied in probability theory, inference for the mean of such a distribution is very limited in the literature. In this paper, two approaches are proposed to obtain inference for the mean of the Poisson-Exponential distribution. Both proposed approaches require the log-likelihood function of the Poisson-Exponential distribution, but the exact for...
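Assuming the random-sum (compound) reading of the Poisson-Exponential distribution suggested by the abstract, S = X_1 + ... + X_N with N ~ Poisson(lam) and X_i ~ Exponential(rate), the target of inference is the mean E[S] = lam/rate. The short Monte Carlo check below only illustrates that quantity; it is not the paper's likelihood-based procedures.

import numpy as np

rng = np.random.default_rng(0)
lam, rate = 3.0, 2.0          # illustrative parameter values
n_rep = 200_000

N = rng.poisson(lam, size=n_rep)
# The sum of N iid Exponential(rate) draws is Gamma(shape=N, scale=1/rate);
# replicates with N = 0 are set to zero explicitly, since a Gamma shape of 0
# is not defined.
S = np.where(N > 0, rng.gamma(np.maximum(N, 1), 1.0 / rate), 0.0)

print("Monte Carlo mean :", S.mean())      # close to 1.5
print("Theoretical mean :", lam / rate)    # exactly 1.5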
Complex Bingham Distribution for Facial Feature Detection
We present a novel method for facial feature point detection on images captured in severely uncontrolled environments, based on a combination of regularized boosted classifiers and a mixture of complex Bingham distributions. The complex Bingham distribution is a rotation-invariant shape representation that can handle pose, in-plane rotation and occlusion better than existing models. Additionally, w...
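The rotation invariance referred to here can be seen directly from the complex Bingham form, in which the density on the complex unit sphere is proportional to exp(z*Az) for a Hermitian matrix A: multiplying z by a unit phase (a planar rotation of the landmark configuration) leaves z*Az unchanged. The numerical check below uses an arbitrary Hermitian matrix and point, not quantities from the paper.

import numpy as np

rng = np.random.default_rng(0)
k = 4
B = rng.standard_normal((k, k)) + 1j * rng.standard_normal((k, k))
A = (B + B.conj().T) / 2.0                    # arbitrary Hermitian parameter matrix

z = rng.standard_normal(k) + 1j * rng.standard_normal(k)
z /= np.linalg.norm(z)                        # a point on the complex unit sphere

def log_unnormalised_density(z, A):
    # z* A z is real whenever A is Hermitian; np.real drops rounding noise.
    return np.real(z.conj() @ A @ z)

theta = 0.7                                   # any planar rotation angle
print(log_unnormalised_density(z, A))
print(log_unnormalised_density(np.exp(1j * theta) * z, A))   # identical value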
Posterior distribution analysis for Bayesian inference in neural networks
This study explores the posterior predictive distributions obtained with various Bayesian inference methods for neural networks. The quality of the distributions is assessed both visually and quantitatively using Kullback–Leibler (KL) divergence, Kolmogorov–Smirnov (KS) distance and precision-recall scores. We perform the analysis using a synthetic dataset that allows for a more detailed examin...
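As a generic sketch of this kind of comparison, assuming two sets of scalar samples (a reference predictive distribution and an approximation to it), the snippet below estimates a histogram-based KL divergence and the two-sample KS statistic with SciPy; the stand-in distributions, sample sizes, and bin count are illustrative, not the paper's evaluation pipeline.

import numpy as np
from scipy.stats import entropy, ks_2samp

rng = np.random.default_rng(0)
reference = rng.normal(0.0, 1.0, size=20_000)       # stand-in "reference" predictive samples
approximation = rng.normal(0.1, 1.1, size=20_000)   # stand-in approximate predictive samples

# Histogram-based estimate of KL(reference || approximation) on a shared grid.
edges = np.histogram_bin_edges(np.concatenate([reference, approximation]), bins=60)
p, _ = np.histogram(reference, bins=edges, density=True)
q, _ = np.histogram(approximation, bins=edges, density=True)
eps = 1e-12                                          # guards against empty bins
kl = entropy(p + eps, q + eps)                       # SciPy normalises p and q internally

ks = ks_2samp(reference, approximation)              # two-sample Kolmogorov-Smirnov test
print(f"KL divergence ~ {kl:.4f}, KS distance = {ks.statistic:.4f}")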
Journal
Journal title: Communications in Statistics - Theory and Methods
Year: 2017
ISSN: 0361-0926, 1532-415X
DOI: 10.1080/03610926.2017.1346805